Reviews: Hyper-Graph-Network Decoders for Block Codes

Neural Information Processing Systems

In this paper, the authors propose to use a fully-connected NN to improve BP decoding for block codes with a regular degree distribution. The results are quite interesting because they show that we can do better than BP for these regular codes by weighting the different contributions coming from the parity checks. In effect, the method tells each bit which parity checks it should trust more at each BP step, which allows the modified BP algorithm to converge faster and more accurately to the correct codeword. The gains are marginal, but given how good BP typically is, that should not come as a surprise and should not be held against the paper. I have several comments about the paper that I would like to be addressed in the final version.
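The "trust weighting" idea the review describes can be sketched as a weighted variable-node update, where learned per-edge weights replace the uniform weighting of vanilla BP. All names and shapes below are illustrative, not the paper's notation:

```python
import numpy as np

# Hypothetical sketch of a weighted variable-to-check message update.
# channel_llr: the bit's channel log-likelihood ratio; check_msgs: messages
# from the bit's other parity checks; weights: learned per-edge weights.
def weighted_vnode_update(channel_llr, check_msgs, weights):
    # Vanilla BP corresponds to weights of all ones; learned weights let
    # each bit trust some parity checks more than others at each step.
    return channel_llr + np.dot(weights, check_msgs)

# One bit with two incoming check messages.
msg = weighted_vnode_update(0.5, np.array([1.2, -0.3]), np.array([0.9, 0.4]))
# 0.5 + 0.9 * 1.2 + 0.4 * (-0.3) = 1.46
```

With all weights set to 1.0 this reduces to the standard BP sum, which is why the learned decoder can only match or reweight, never discard, the information BP already uses.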


Reviews: Hyper-Graph-Network Decoders for Block Codes

Neural Information Processing Systems

This paper proposes a neural-network-based decoder architecture for binary linear block codes with constant-degree variable nodes. It is based on message passing on the unfolded Tanner graph, but replaces the variable-node operation in each iteration with a neural network g whose parameters are provided by another neural network f that takes the absolute values of the messages as its input. Experimental results demonstrate that the proposed scheme performs well for various types of codes. Although the review scores were around the acceptance threshold in the initial round of review, two reviewers raised their scores after the authors' rebuttal, so all the reviewers are now positive. I would thus like to recommend acceptance of this paper.
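The f/g pattern in this summary is a hypernetwork: f maps the absolute values of the incoming messages to the parameters of g, and g then computes the variable-node output from the messages themselves. The sketch below assumes a single linear layer for g and fixed random parameters for f purely for illustration; it is not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                            # variable-node degree (illustrative)
W_f = rng.normal(size=(d, d))    # parameters of the hypernetwork f (assumed)

def f(abs_msgs):
    # Hypernetwork: emits the weight vector that parameterizes g.
    return np.tanh(W_f @ abs_msgs)

def g(msgs, theta):
    # Primary network: here a single linear layer with weights from f.
    return float(theta @ msgs)

msgs = rng.normal(size=d)
out = g(msgs, f(np.abs(msgs)))   # one variable-node operation
```

Note one consequence of conditioning f on absolute values: in this sketch, negating every incoming message negates the output while leaving the generated weights unchanged, a sign symmetry that vanilla BP message passing also has.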


Hyper-Graph-Network Decoders for Block Codes

Nachmani, Eliya, Wolf, Lior

Neural Information Processing Systems

Neural decoders were shown to outperform classical message passing techniques for short BCH codes. In this work, we extend these results to much larger families of algebraic block codes, by performing message passing with graph neural networks. The parameters of the sub-network at each variable-node in the Tanner graph are obtained from a hypernetwork that receives the absolute values of the current messages as input. To add stability, we employ a simplified version of the arctanh activation that is based on a high-order Taylor approximation of this activation function. Our results show that for a large number of algebraic block codes, from diverse families of codes (BCH, LDPC, Polar), the decoding obtained with our method outperforms the vanilla belief propagation method as well as other learning techniques from the literature.
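The stabilization step mentioned in the abstract can be illustrated concretely: arctanh diverges as its argument approaches 1, so it is replaced by a truncated Taylor series, which stays bounded. The truncation order `q` below is illustrative; the exact order used in the paper is not specified here:

```python
import numpy as np

# arctanh(x) = sum over k >= 0 of x**(2k+1) / (2k+1); keep the first q terms.
def arctanh_taylor(x, q=3):
    return sum(x ** (2 * k + 1) / (2 * k + 1) for k in range(q))

# Near the singularity the truncated series stays bounded, unlike
# np.arctanh(0.999) (about 3.8), so message magnitudes cannot blow up.
bounded = arctanh_taylor(0.999)
# For small arguments the truncation closely matches the true arctanh,
# so the decoder's behavior on weak messages is essentially unchanged.
close = arctanh_taylor(0.1)
```

This is the usual trade-off of such smoothing: accuracy is sacrificed only where the original activation would have produced extreme values anyway.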